What “doesn’t work”? Are you getting a specific error when trying to upload to App Store Connect? If so, what is it? Is your simulator generating screenshots that are not the same resolution as the simulated device? Something is very strange here, because making App Store screenshots in the simulator works perfectly for virtually everyone else.
The error message is correct. The response is an array of objects compatible with your APIResponse structure, but your code tries to decode it as a JSONData structure. So the decoder is expecting to find a top-level JSON object with a field named data, which of course doesn’t exist in the response.
The correct type to pass to the decoder is [APIResponse].self.
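To illustrate (using hypothetical field names, since the actual shape of APIResponse isn’t shown here), decoding a top-level JSON array looks like this:

```swift
import Foundation

// Hypothetical element type; substitute your real APIResponse fields.
struct APIResponse: Decodable {
    let id: Int
    let name: String
}

// A response whose top level is a JSON array, not an object with a `data` field.
let json = Data(#"[{"id": 1, "name": "first"}, {"id": 2, "name": "second"}]"#.utf8)

// Decode as [APIResponse].self, not as a wrapper structure.
let responses = try! JSONDecoder().decode([APIResponse].self, from: json)
```

If the server ever does wrap the array in an object, you’d instead declare a wrapper type whose property names match the actual JSON fields.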
I have a string encoded as ascii characters that contains letters and numbers.
Your hex dump, taken as a whole, isn’t an ASCII string. And if it’s supposed to be an Adobe Photoshop brush file then it definitely won’t be a simple string. I don’t know this format but it looks like a binary structure that includes short ASCII tags. Does your question pertain to parsing just the individual tag-like parts from within the larger structure?
Example.) "40" appears as "@D". I was able to print out the ascii codes of these and found it to be [64, 68]
Well 64 decimal is equal to 0x40 (hex), but this isn’t too helpful without knowing details of the file format. How are you sure beforehand that those two bytes are to be interpreted as that value? Do you have the documentation for this file format?
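To see this concretely, here’s a quick sketch (in Swift, purely for illustration) showing that the same two bytes are simultaneously the ASCII text “@D” and the hex values 40 44 — which view is “correct” depends entirely on the file format:

```swift
import Foundation

let bytes: [UInt8] = [64, 68]   // the two bytes from the dump

// Interpreted as ASCII characters.
let ascii = String(bytes: bytes, encoding: .ascii)   // "@D"

// Interpreted as raw hex values.
let hex = bytes.map { String(format: "%02X", $0) }.joined(separator: " ")   // "40 44"
```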
But for those of us curious about the technology, here’s a good introductory video: https://youtu.be/RXJKdh1KZ0w
We sure get all levels of questions in this forum. ;-)
There isn’t one; here’s why.
In your specific case, a 15.3 simulator won’t necessarily help since you already have some devices running 15.3.
Also note the current Xcode beta has a 15.4 simulator, which may be worth a try.
Here’s what I meant by a minimalist code example using “code block” formatting:
NSDictionary *dict = [NSDictionary dictionaryWithObject: @"a" forKey: @"Apple"];
NSAssert(([dict valueForKey: @"Apple"] != nil), @"exact-case key should be found");
NSAssert(([dict valueForKey: @"apple"] == nil), @"differently-cased key should not be found");
(This formatting works only in a top-level reply; as you can see, posting code in a comment doesn’t work well.)
Returning to the clarifying questions from above: On what platform does this occur: iOS? macOS? And what OS: iOS 13? iOS 14? iOS 15? The version of Xcode itself isn’t really important.
How are you verifying that myAppleDictionary contains the exact keys and values you expect?
If you run the code example given above in both of your scenarios, what happens?
On what platform and OS version do you see -[valueForKey:] behaving case-insensitively? Can you post (using “code block” formatting) a minimalist code example that demonstrates this? Key/value containers are always supposed to be case-sensitive.
Unfortunately, what you are trying to do won’t work on iOS. Core Bluetooth supports only Bluetooth Low Energy, while the HC-05 module uses Classic Bluetooth, specifically the serial port profile. Those aren’t compatible.
If the question really is: “is there an API to get the phone number of the current device?” then the answer is: no there isn’t, because that would be a massive privacy fail.
I’ll just claim that “exactly the same” included endianness, even if I didn’t think of it at the time. 😉
The context isn’t clear; the OP mentioned “method call back” which could mean the camera vendor provided a library/framework for communicating with the camera and it (presumably/hopefully) returns a blob compatible with its declaration of struct ins_gyro_info for the current platform. If so, great.
(I guessed this was the scenario since struct ins_gyro_info is a C declaration, possibly copied from a library header file.)
But the OP also mentioned Bluetooth and Wi-Fi explicitly, so maybe the app actually needs to connect directly to the camera via a platform API and process the data by referring to a documented wire format, without any help from a library. If so, and if the endianness doesn’t match, then indeed my code example won’t work.
In this latter case, it may be cleanest to implement the needed “marshaling” code in C and then call it from Swift. Hey, that sounds like the seed of the hypothetical camera support library I mentioned earlier.
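As a sketch of what such marshaling might look like when endianness matters (assuming, hypothetically, a documented little-endian wire format), you can copy a fixed-width field out of the data without any alignment assumptions and convert it explicitly:

```swift
import Foundation

// Hypothetical 4-byte little-endian field from a documented wire format.
let wire = Data([0x78, 0x56, 0x34, 0x12])

// Copy the bytes into a UInt32 without assuming the source is aligned,
// then reinterpret them explicitly as little-endian.
var raw: UInt32 = 0
withUnsafeMutableBytes(of: &raw) { $0.copyBytes(from: wire) }
let value = UInt32(littleEndian: raw)   // 0x12345678 on any host
```

The explicit UInt32(littleEndian:) step is what makes this correct on both little- and big-endian hosts, which a plain memory load of the whole struct would not be.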
Assuming the Data object contains bytes laid out exactly the same as struct ins_gyro_info in C/C++, and that structure is imported into Swift, then you can use Swift’s “unsafe” memory API to perform the conversion.
If “unsafe” is a new area for you, I recommend starting by watching Safely Manage Pointers in Swift from WWDC20.
That said, here’s one way to accomplish what you want:
let data = readRawGyroInfoDataFromCameraOrWhatever()
let gyroInfo = data.withUnsafeBytes { buffer in
    // Note: load(as:) requires the bytes to be suitably aligned for the type;
    // on Swift 5.7 and later, loadUnaligned(as:) lifts that requirement.
    buffer.load(as: ins_gyro_info.self)
}
How can I access it?
You need to use the Core Location API exactly as documented. You don’t get any location data “for free” just because another app is using it (if I’m parsing the question correctly). What you may get “for free” is accuracy that’s better than you requested, if the device hardware happens to be generating improved accuracy on behalf of another app. But don’t rely on this behavior! Just request the accuracy you need.
(This feels like one of those questions where the right answer is “What are you actually trying to accomplish?”)
Do I have to switch teams in my Xcode?
Yes, since the app now belongs to your client’s team.
Xcode 13.2.1 should work fine with iOS 15.3, after it downloads the necessary files from the device. Plug it in and check the Devices and Simulators window to make sure there’s no big yellow error message. Maybe reboot the Mac and device too.